Methodology

This document outlines the methodology for implementing an AI assistant. It covers the stages of implementation, the tools used at each step, and the stakeholders involved.

Stages of Implementation

1. Preparation

The implementation process of an assistant begins with a preparation stage where the objectives of the assistant, stakeholders, information groups to be included, system integrations, and client communication channels are defined. It is crucial to involve business analysts and functional teams early on to thoroughly understand the challenges the assistant is expected to address.

2. Implementation Process

The implementation process is organized by information groups. Each group goes through the stages of development, testing, and production in turn. The start of each group's track is staggered so that the teams working at each stage remain fully utilized.

Other tasks, such as developing specific integrations, custom tools, or personalizing communication channels with the client, are performed in parallel to the lifecycle of the information groups.
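The staggering described above can be illustrated with a small scheduling sketch. The group names, stage durations, and start date below are illustrative assumptions, not a prescribed timeline:

```python
from datetime import date, timedelta

def staggered_schedule(groups, start, stage_weeks=(3, 2, 1)):
    """Compute per-group start dates for development, testing, and production.

    Each group's track starts when the previous group leaves development,
    so the teams at each stage stay busy in parallel. Stage durations are
    hypothetical; real projects set their own.
    """
    dev_weeks = stage_weeks[0]
    schedule = {}
    for i, group in enumerate(groups):
        t = start + timedelta(weeks=i * dev_weeks)  # track start, staggered
        stage_starts = {}
        for stage, weeks in zip(("development", "testing", "production"), stage_weeks):
            stage_starts[stage] = t
            t += timedelta(weeks=weeks)
        schedule[group] = stage_starts
    return schedule

# Example: two groups, three-week development stage.
plan = staggered_schedule(["hr", "finance"], start=date(2024, 1, 1))
```

With these assumed durations, the "finance" group enters development on the same day "hr" enters testing, keeping both teams occupied.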

2.1 Tasks During Development

During development, the assistant is initially configured and the information group's documents are ingested. The functional team provides feedback on responses so adjustments can be made, gathers common questions with their expected answers, and defines criteria for acceptable responses.

Implementers adjust the assistant by fine-tuning content, retrieval configurations, flow designs, tool settings, response styles, predefined questions, and lessons.
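The questions, expected answers, and acceptance criteria gathered by the functional team can be captured in a simple structure. The class and field names below are a minimal sketch for illustration, not the portal's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class ExpectedAnswer:
    """A common question, its expected answer, and acceptance criteria."""
    question: str
    expected_answer: str
    criteria: list[str] = field(default_factory=list)  # what makes a response acceptable

@dataclass
class InformationGroup:
    """One information group moving through development, testing, and production."""
    name: str
    documents: list[str] = field(default_factory=list)  # documents to ingest
    qa_pairs: list[ExpectedAnswer] = field(default_factory=list)

# Hypothetical example of what the functional team might record.
hr = InformationGroup(
    name="human-resources",
    documents=["leave_policy.pdf", "onboarding_guide.pdf"],
)
hr.qa_pairs.append(ExpectedAnswer(
    question="How many vacation days do new employees get?",
    expected_answer="New employees receive 22 vacation days per year.",
    criteria=["cites the leave policy", "states the exact number of days"],
))
```

Structuring the material this way lets the same records drive both the feedback loop during development and the automated evaluations later on.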

2.2 Tasks During Testing

Functional tests are conducted to identify errors and improvements. Evaluations using the evaluator tool are also performed to ensure performance before moving to production.
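An evaluation of this kind can be sketched as a loop that sends each gathered question to the assistant and scores the response against the expected answer. The keyword-overlap metric, the `ask` callable, and the threshold below are simplifying assumptions; the actual evaluator tool may use richer scoring:

```python
def keyword_overlap(expected: str, actual: str) -> float:
    """Fraction of expected-answer words that appear in the actual response."""
    expected_words = set(expected.lower().split())
    actual_words = set(actual.lower().split())
    if not expected_words:
        return 0.0
    return len(expected_words & actual_words) / len(expected_words)

def evaluate(qa_pairs, ask, threshold=0.7):
    """Ask each question and flag responses scoring below the threshold."""
    failures = []
    for question, expected in qa_pairs:
        actual = ask(question)
        score = keyword_overlap(expected, actual)
        if score < threshold:
            failures.append((question, score))
    return failures

# Stub assistant for illustration: returns a canned answer per question.
canned = {"What is the refund window?": "Refunds are accepted within 30 days."}
failures = evaluate(
    [("What is the refund window?", "Refunds are accepted within 30 days.")],
    ask=lambda q: canned.get(q, ""),
)
```

An empty failure list indicates every response met the threshold, which could serve as one gate before promotion to production.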

2.3 Tasks During Production

In this environment, interactions are monitored, and user feedback is collected and analyzed through the real-time monitoring module. Periodic automated evaluations using the evaluator tool are conducted to ensure the environment's performance. Additionally, semantic analysis can be performed on both uploaded documents and assistant conversations through the analyzer module, enabling quality assessment and continuous improvement.
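Collected feedback can be summarized over a recent window to track production health. The event shape, rating labels, and window length below are illustrative assumptions, not the monitoring module's real schema:

```python
from collections import Counter
from datetime import datetime, timedelta

def feedback_summary(events, window_days=7, now=None):
    """Aggregate thumbs-up/down feedback events within a recent window.

    Each event is assumed to be a dict with a "timestamp" (datetime)
    and a "rating" ("up" or "down"); this is a sketch, not a real schema.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    recent = [e for e in events if e["timestamp"] >= cutoff]
    counts = Counter(e["rating"] for e in recent)
    total = sum(counts.values())
    return {
        "total": total,
        "positive_rate": counts["up"] / total if total else None,
    }

# Hypothetical feedback events.
events = [
    {"timestamp": datetime(2024, 6, 14), "rating": "up"},
    {"timestamp": datetime(2024, 6, 13), "rating": "down"},
    {"timestamp": datetime(2024, 5, 1), "rating": "up"},  # outside the window
]
summary = feedback_summary(events, window_days=7, now=datetime(2024, 6, 15))
```

Tracking a rolling positive rate like this gives the implementation team a simple signal for when the assistant's production performance drifts and retuning is needed.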

All of these activities can be managed through the implementation portal.